A new hybrid conjugate gradient algorithm for unconstrained optimization

Authors

  • J. Chen, School of Mathematics and Statistics, Southwest University, Chongqing 400715, P.R. China
  • J. Zhang, School of Mathematics and Statistics, Southwest University, Chongqing 400715, P.R. China
  • X. Han, School of Mathematics and Statistics, Southwest University, Chongqing 400715, P.R. China
Abstract:

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. The new method generates sufficient descent directions independent of any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, especially for solving high-dimensional problems.
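The abstract describes the general shape of such a method: a conjugate gradient iteration whose search direction is kept a sufficient descent direction, with step lengths chosen by a Wolfe line search. The paper's own hybrid β_k formula is not given in this abstract, so the sketch below is only a generic illustration: it uses a standard truncated Hestenes-Stiefel/Dai-Yuan hybrid as a stand-in for the proposed parameter, together with a simple bisection-based weak Wolfe line search. All function and parameter names here are illustrative, not the authors' implementation.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection search for a step length satisfying the weak Wolfe conditions."""
    lo, hi, t = 0.0, np.inf, 1.0
    fx = f(x)
    dg = grad(x) @ d  # directional derivative; negative for a descent direction
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * dg:      # Armijo condition fails: shrink
            hi = t
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * dg:      # curvature condition fails: grow
            lo = t
            t = 2.0 * t if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return t
    return t

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic hybrid CG loop (illustrative; not the paper's specific method)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = wolfe_line_search(f, grad, x, d)
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        # Stand-in hybrid parameter: truncated Hestenes-Stiefel / Dai-Yuan.
        beta = max(0.0, min(g_new @ y, g_new @ g_new) / dy) if dy != 0 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0:  # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a convex quadratic such as f(x) = x₁² + 4x₂², this loop drives the gradient norm below the tolerance in a handful of iterations; the descent safeguard mirrors the abstract's emphasis on directions that remain descent directions regardless of the line search.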


Similar resources

An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems

In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem, and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...


A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that, based on eigenvalue analysis, the search direction always satisfies the sufficient descent condition independent of the line search method. The globa...


New hybrid conjugate gradient method for unconstrained optimization

Conjugate gradient methods are widely used for unconstrained optimization, especially large-scale problems. Most conjugate gradient methods do not always generate a descent search direction, so the descent condition is usually assumed in the analyses and implementations. Dai and Yuan (1999) proposed a conjugate gradient method which generates a descent direction at every iteration. Yabe and...


New Hybrid Conjugate Gradient Algorithms for Unconstrained Optimization

New hybrid conjugate gradient algorithms are proposed and analyzed. In these hybrid algorithms the parameter β_k is computed as a convex combination of the Polak-Ribière-Polyak and Dai-Yuan conjugate gradient parameters. In one hybrid algorithm the parameter in the convex combination is computed in such a way that the conjugacy condition is satisfied, independent of the line search. In the o...
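The convex-combination construction described in this snippet can be written down directly: β_k = (1 − θ_k)·β_k^PRP + θ_k·β_k^DY with θ_k ∈ [0, 1]. The rule for choosing θ_k (e.g., from the conjugacy condition) is specific to that paper and is not reproduced here; the function below only shows the combination itself, with θ supplied by the caller.

```python
import numpy as np

def beta_convex(g_new, g_old, d, theta):
    """Convex combination of the PRP and DY conjugate gradient parameters.

    theta in [0, 1] selects between beta_PRP (theta = 0) and beta_DY
    (theta = 1). How theta is chosen is method-specific and not shown here.
    """
    theta = min(max(theta, 0.0), 1.0)   # keep the combination convex
    y = g_new - g_old                   # gradient difference y_k
    beta_prp = (g_new @ y) / (g_old @ g_old)
    beta_dy = (g_new @ g_new) / (d @ y)
    return (1.0 - theta) * beta_prp + theta * beta_dy
```

At the endpoints θ = 0 and θ = 1 this reduces exactly to the PRP and DY formulas, which is what makes such hybrids inherit properties (strong practical performance from PRP, strong convergence theory from DY) from both parents.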


A family of hybrid conjugate gradient methods for unconstrained optimization

Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems. Recently, they have been much studied. This paper proposes a three-parameter family of hybrid conjugate gradient methods. Two important features of the family are that (i) it can avoid the propensity of small steps, namely, if a small step is generated away from the ...


A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization

In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line search. We use a steplength technique that ensures the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve it and provide further analysis.


Journal title

Volume 43, Issue 6, pages 2067–2084

Publication date: 2017-11-30

